Instagram’s recommendation algorithms linked together and even promoted a “vast pedophile network” whose accounts advertised the sale of illicit “child-sex material” on the platform, according to the findings of an alarming report.
Instagram allowed users to search by hashtags related to child-sex abuse, including graphic terms such as #pedowhore, #preteensex, #pedobait and #mnsfw — the latter an acronym meaning “minors not safe for work,” researchers at Stanford University and the University of Massachusetts Amherst told the Wall Street Journal.
The hashtags directed users to accounts that purportedly offered to sell pedophilic materials via “menus” of content, including videos of children harming themselves or committing acts of bestiality, the researchers said.
Some accounts allowed buyers to “commission specific acts” or arrange “meet ups,” the Journal said.

More at NYPost
“Extremely concerning,” Elon Musk (@elonmusk) tweeted on June 7, 2023.